Evolutionary operant behavior learning model and its application to mobile robot obstacle avoidance
GAO Yuanyuan, ZHU Fan, SONG Hongjun
Journal of Computer Applications, 2013, 33(08): 2283-2288.
To address the poor self-adaptive ability of robots in obstacle avoidance, an Evolutionary Operant Behavior Learning Model (EOBLM) was proposed for mobile robots learning obstacle avoidance in unknown environments. The model combines the evolutionary ideas of the Genetic Algorithm (GA) with Operant Conditioning (OC) and Adaptive Heuristic Critic (AHC) learning, and is a modified version of the AHC learning architecture. The Adaptive Critic Element (ACE) network is a multi-layer feedforward network trained with the TD(λ) algorithm and gradient descent; a tropism mechanism designed at this stage serves as intrinsic motivation and directs the orientation of the agent's learning. The Adaptive Selection Element (ASE) network optimizes operant behavior to achieve the best mapping from state to action. The optimization proceeds in two stages. In the first stage, the information entropy obtained by the OC learning algorithm is used as the individual fitness, and the GA searches for the optimal individual. In the second stage, OC learning selects the optimal operant behavior within that optimal individual and computes new information entropy. Obstacle-avoidance experiments show that the method endows a mobile robot with the ability to learn obstacle avoidance actively for path planning through constant interaction with the environment. Compared with the traditional AHC learning algorithm, the proposed model exhibits better self-learning and self-adaptive ability.
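The two-stage optimization in the ASE network can be illustrated with a minimal sketch. The abstract gives no implementation details, so everything below is assumed for illustration: individuals are taken to be action-probability vectors, fitness is taken to be (low) information entropy, and the crossover/mutation operators, population size, and second-stage selection rule are all hypothetical choices, not the authors' method.

```python
import random
import math

def entropy(probs):
    # Shannon entropy of an action-probability vector (lower = more decisive policy)
    return -sum(p * math.log(p) for p in probs if p > 0)

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

def ga_optimize(pop_size=20, n_actions=4, generations=50, seed=0):
    rng = random.Random(seed)
    # Stage 1: GA search. Each individual encodes action-selection
    # probabilities; using entropy as the fitness, the GA favors
    # low-entropy (decisive) individuals.
    pop = [normalize([rng.random() + 1e-6 for _ in range(n_actions)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=entropy)                     # low entropy = high fitness
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # crossover
            i = rng.randrange(n_actions)
            child[i] = max(child[i] + rng.gauss(0, 0.1), 1e-6)   # mutation
            children.append(normalize(child))
        pop = survivors + children
    best = min(pop, key=entropy)
    # Stage 2: OC-style selection — pick the operant behavior (action)
    # with the highest probability within the best individual.
    return best, max(range(n_actions), key=lambda i: best[i])

best_individual, chosen_action = ga_optimize()
```

Under these assumptions, the GA drives the population toward decisive policies, and the second stage extracts the single operant behavior to execute; in the paper, the new entropy computed after OC learning would then feed back into the next round of evolution.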